
    Non-Negative Local Sparse Coding for Subspace Clustering

    Subspace sparse coding (SSC) algorithms have proven beneficial to clustering problems: they provide an alternative data representation in which the underlying structure of the clusters can be better captured. However, most research in this area focuses on enhancing the sparse coding part of the problem. In contrast, we introduce a novel objective term in our proposed SSC framework which focuses on the separability of data points in the coding space. We also provide mathematical insights into how this local-separability term improves the clustering result of the SSC framework. Our proposed non-linear local SSC algorithm (NLSSC) also benefits from an efficient choice of its sparsity terms and constraints. The NLSSC algorithm is further formulated in a kernel-based framework (NLKSSC), which can represent the nonlinear structure of data. In addition, we address the possibility of redundancies in sparse coding results and their negative effect on graph-based clustering problems. We introduce the link-restore post-processing step to improve the representation graph of non-negative SSC algorithms such as ours. Empirical evaluations on well-known clustering benchmarks show that the proposed NLSSC framework yields better clusterings than the state-of-the-art baselines, and demonstrate the effectiveness of link-restore post-processing in improving clustering accuracy by correcting broken links of the representation graph.
    Comment: 15 pages, IDA 2018 conference
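    The abstract describes NLSSC only at a high level, so as a point of reference here is a minimal sketch of the generic non-negative SSC pipeline that link-restore post-processes, not the paper's actual NLSSC objective: each point is coded over the others with non-negative sparse coefficients, the coefficients are symmetrized into a representation graph, and the graph is spectrally clustered. The function name `nonneg_ssc` and the plain ℓ1 coder are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso
    from sklearn.cluster import SpectralClustering

    def nonneg_ssc(X, n_clusters, alpha=0.01):
        """Generic non-negative sparse subspace clustering sketch.

        X: (n_features, n_samples) data matrix; columns are points.
        Each point is represented as a non-negative sparse combination
        of the other points (self-representation with zero diagonal).
        """
        n = X.shape[1]
        C = np.zeros((n, n))
        for i in range(n):
            # Dictionary = all points except x_i, so C has a zero diagonal.
            idx = np.delete(np.arange(n), i)
            coder = Lasso(alpha=alpha, positive=True, max_iter=5000)
            coder.fit(X[:, idx], X[:, i])
            C[idx, i] = coder.coef_
        # Symmetrized coefficients form the representation graph;
        # broken links here are what link-restore would repair.
        W = C + C.T
        labels = SpectralClustering(
            n_clusters=n_clusters, affinity="precomputed",
            assign_labels="discretize", random_state=0,
        ).fit_predict(W)
        return labels, C
    ```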

    Tensorized multi-view subspace representation learning

    Self-representation based subspace learning has shown its effectiveness in many applications. In this paper, we extend traditional subspace representation learning by simultaneously taking advantage of multiple views and a prior constraint. Accordingly, we establish a novel algorithm termed Tensorized Multi-view Subspace Representation Learning. To exploit the different views, the subspace representation matrices of the individual views are stacked into a low-rank tensor, which effectively models the high-order correlations of multi-view data. To incorporate prior information, a constraint matrix is devised to guide the subspace representation learning within a unified framework. The subspace representation tensor equipped with a low-rank constraint elegantly models the complementary information among the views, reduces the redundancy of the subspace representations, and thus improves the accuracy of subsequent tasks. We formulate the model as a tensor nuclear norm minimization problem constrained by an ℓ2,1-norm term and linear equalities, and solve it efficiently with an Augmented Lagrangian Alternating Direction Minimization method. Extensive experimental results on diverse multi-view datasets demonstrate the effectiveness of our algorithm.
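    The abstract names the optimization ingredients but not the update steps. The sketch below shows one plausible building block, assuming a t-SVD based tensor nuclear norm (the paper may use a different tensor norm): the singular value thresholding proximal step that an augmented-Lagrangian alternating-direction solver would apply to the stacked representation tensor at each iteration. The name `tsvd_svt` and the slice layout are illustrative assumptions.

    ```python
    import numpy as np

    def tsvd_svt(T, tau):
        """Singular value thresholding under the t-SVD tensor nuclear norm.

        T: (n1, n2, n3) tensor stacking the per-view subspace
        representation matrices along the third mode (one slice per view).
        This is the proximal operator the low-rank subproblem of an
        augmented-Lagrangian ADM solver would call each iteration.
        """
        # FFT along the view mode turns the t-product into
        # independent, slice-wise matrix operations.
        Tf = np.fft.fft(T, axis=2)
        out = np.zeros_like(Tf)
        for k in range(T.shape[2]):
            U, s, Vh = np.linalg.svd(Tf[:, :, k], full_matrices=False)
            s = np.maximum(s - tau, 0.0)      # soft-threshold singular values
            out[:, :, k] = (U * s) @ Vh       # rebuild the thresholded slice
        # Back to the original domain; the imaginary part is numerical noise.
        return np.real(np.fft.ifft(out, axis=2))
    ```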

    Sparse Subspace Clustering: Algorithm, Theory, and Applications
